112 research outputs found

    K-DIME: An affective image filtering system


    The affective body argument in technology design

    In this paper, I argue that the affective body is underused in the design of interactive technology despite what it has to offer. Whilst the literature shows it to be a powerful affective communication channel, it is often ignored in favour of the more commonly studied facial and vocal expression modalities. This is despite it being as informative and, in some situations, even more reliable than the other affective channels. In addition, due to the proliferation of increasingly cheap and ubiquitous movement sensing technologies, the regulatory affective functions of the body could open new possibilities in various application areas. In this paper, after presenting a brief summary of the opportunities that the affective body offers to technology designers, I use the case of physical rehabilitation to discuss how its use could lead to interesting new solutions and more effective therapies.

    RealPen: Providing Realism in Handwriting Tasks on Touch Surfaces using Auditory-Tactile Feedback

    We present RealPen, an augmented stylus for capacitive tablet screens that recreates the physical sensation of writing on paper with a pencil, ball-point pen or marker pen. The aim is to create a more engaging experience when writing on touch surfaces, such as screens of tablet computers. This is achieved by regenerating the friction-induced oscillation and sound of a real writing tool in contact with paper. To generate realistic tactile feedback, our algorithm analyzes the frequency spectrum of the friction oscillation generated when writing with traditional tools, extracts principal frequencies, and uses the actuator's frequency response profile as an adjustment weighting function. We enhance the realism by providing sound feedback aligned with the writing pressure and speed. Furthermore, we investigated the effects of superposition and fluctuation of several frequencies on human tactile perception, evaluated the performance of RealPen, and characterized users' perception and preference of each feedback type.
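    As a rough illustration of the kind of spectral processing the abstract describes (not the authors' implementation), the sketch below extracts the strongest frequency components of a recorded pen-on-paper friction signal and weights them by a hypothetical actuator frequency-response curve before resynthesis. The function names, the synthetic signal, and the response profile are all assumptions.

```python
# Illustrative sketch only: principal-frequency extraction plus actuator
# compensation, in the spirit of the RealPen description. Signal and
# response values are synthetic placeholders.
import numpy as np

def principal_frequencies(signal, sample_rate, n_peaks=5):
    """Return the n_peaks strongest frequency components of the signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    top = np.argsort(spectrum)[-n_peaks:]
    return freqs[top], spectrum[top]

def actuator_weights(freqs, response_freqs, response_gain):
    """Compensate each component by an (assumed) actuator response profile."""
    gain = np.interp(freqs, response_freqs, response_gain)
    return 1.0 / np.maximum(gain, 1e-6)  # boost what the actuator reproduces weakly

# Synthetic stand-in for a real friction recording.
sr = 8000
t = np.arange(0, 1.0, 1.0 / sr)
friction = 0.6 * np.sin(2 * np.pi * 180 * t) + 0.3 * np.sin(2 * np.pi * 410 * t)
freqs, amps = principal_frequencies(friction, sr)
weights = actuator_weights(freqs,
                           response_freqs=np.array([50, 500, 1000]),
                           response_gain=np.array([0.4, 1.0, 0.7]))
drive_amps = amps * weights  # amplitudes that would drive the stylus actuator
```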

    MeTA: Mediated Touch and Affect

    The main aim of this first workshop on Mediated Touch and Affect (MeTA) is to bring together researchers from diverse communities, such as affective computing, haptics, augmented reality, communication, design, psychology, human-robot interaction, and telepresence. The goal is to discuss the current state of research on mediated touch and affect and to formulate a research agenda for future directions in research on aspects of the touch-technology-affect triangle.

    Musical Expectancy in Squat Sonification For People Who Struggle With Physical Activity

    Physical activity is important for a healthy lifestyle. However, it can be hard to stay engaged with exercise, and this can often lead to avoidance. Sonification has been used to support physical activity through the optimisation/correction of movement. Though previous work has shown how sonification can improve movement execution and motivation, the specific mechanisms of motivation have yet to be investigated in the context of challenging exercises. We investigate the role of musical expectancy as a way to leverage people's implicit and embodied understanding of music within movement sonification, providing information on technique while also motivating continuation of movement and rewarding its completion. The paper presents two studies showing how this musically informed sonification can be used to support the squat movement. The results show how musical expectancy impacted people's perception of their own movement in terms of reward and motivation, as well as the way in which they moved.
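    To make the idea of expectancy-based sonification concrete, here is a toy mapping (an assumption for illustration, not the study's design) in which squat depth walks up a major scale and only reaching the target depth resolves the phrase on the tonic, rewarding completion of the movement.

```python
# Toy sketch: scale, target depth, and mapping are illustrative assumptions.
SCALE = [60, 62, 64, 65, 67, 69, 71]   # C major up to the leading tone (MIDI numbers)
RESOLUTION = 72                        # tonic an octave up, heard only at full depth

def depth_to_note(knee_flexion_deg, target_deg=90.0):
    """Map knee flexion angle onto a scale degree; full depth resolves the phrase."""
    progress = max(0.0, min(knee_flexion_deg / target_deg, 1.0))
    if progress >= 1.0:
        return RESOLUTION
    return SCALE[int(progress * len(SCALE))]

def sonify_rep(flexion_samples):
    """Return the MIDI note sequence for one squat repetition."""
    return [depth_to_note(a) for a in flexion_samples]

print(sonify_rep([10, 30, 50, 70, 90]))   # shallow -> deep: ascending, resolved melody
print(sonify_rep([10, 30, 50, 60]))       # incomplete squat: phrase left unresolved
```

    The design choice the sketch tries to convey is that an unresolved melodic phrase creates an expectation that only a completed repetition satisfies, which is one plausible way musical expectancy could both inform and motivate the movement.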

    Pain level recognition using kinematics and muscle activity for physical rehabilitation in chronic pain

    People with chronic musculoskeletal pain would benefit from technology that provides run-time personalized feedback and helps them adjust their physical exercise plan. However, increased pain during physical exercise, or anxiety about anticipated pain increase, may lead to setbacks and intensified sensitivity to pain. Our study investigates the possibility of detecting pain levels from the quality of body movement during two functional physical exercises. By analyzing recordings of kinematics and muscle activity, our feature optimization algorithms and machine learning techniques can automatically discriminate between people with low-level pain, people with high-level pain, and control participants while exercising. Best results were obtained with feature set optimization algorithms: 94% and 80% for the full trunk flexion and sit-to-stand movements respectively, using Support Vector Machines. As depression can affect the pain experience, we included participants' depression scores from a standard questionnaire, and this improved discrimination between the control participants and the people with pain when Random Forests were used. / Note: As originally published there is an error in the document. The following information was omitted by the authors: "The project was funded by the EPSRC grant Emotion & Pain Project EP/H017178/1 and Olugbade was supported by the 2012 Nigerian PRESSID PhD funding." The article PDF remains unchanged.
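    The general pipeline the abstract describes, feature selection followed by SVM classification of pain level, can be sketched as below. The synthetic data, the choice of SelectKBest, and all parameters are placeholders, not the study's configuration or features.

```python
# Hedged sketch of a feature-selection + SVM pipeline; data and settings are
# stand-ins, not those of the EmoPain study.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 40))       # stand-in for kinematic + EMG features per trial
y = rng.integers(0, 3, size=60)     # 0 = control, 1 = low-level pain, 2 = high-level pain

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=10),   # keep the most discriminative features
    SVC(kernel="rbf", C=1.0),
)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```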

    Multimodal Data Fusion based on the Global Workspace Theory

    We propose a novel neural network architecture, named the Global Workspace Network (GWN), which addresses the challenge of dynamic and unspecified uncertainties in multimodal data fusion. The GWN models attention across modalities evolving through time, and is inspired by the well-established Global Workspace Theory from cognitive science. The GWN achieved an average F1 score of 0.92 for discrimination between pain patients and healthy participants, and an average F1 score of 0.75 for further classification of three pain levels for a patient, both based on the multimodal EmoPain dataset captured from people with chronic pain and healthy people performing different types of exercise movements in unconstrained settings. In these tasks, the GWN significantly outperforms the typical fusion approach of merging by concatenation. We further provide extensive analysis of the behaviour of the GWN and its ability to address uncertainties (hidden noise) in multimodal data.
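    The contrast the abstract draws between attention-based fusion and concatenation can be illustrated with a deliberately minimal sketch (much simpler than the GWN itself): per-modality embeddings are weighted by relevance scores and summed, so a noisy modality can be down-weighted rather than entering the fused representation with fixed influence. In the actual model such weights would be learned by an attention mechanism; here they are passed in by hand for illustration.

```python
# Minimal, illustrative modality-fusion sketch; dimensions and scores are assumptions.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_fusion(modalities, relevance_scores):
    """Weight per-modality embeddings by relevance and sum them."""
    weights = softmax(np.asarray(relevance_scores, dtype=float))
    return sum(w * m for w, m in zip(weights, modalities))

def concat_fusion(modalities):
    """Baseline: merge by concatenation, every modality has fixed influence."""
    return np.concatenate(modalities)

motion = np.random.randn(16)   # e.g. embedding of body-movement features
emg = np.random.randn(16)      # e.g. embedding of muscle-activity features
fused_att = attention_fusion([motion, emg], relevance_scores=[2.0, 0.1])
fused_cat = concat_fusion([motion, emg])
print(fused_att.shape, fused_cat.shape)   # (16,) vs (32,)
```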

    Multi-Rater Consensus Learning for Modeling Multiple Sparse Ratings of Affective Behaviour

    The use of multiple raters to label datasets is an established practice in affective computing. The principal goal is to reduce unwanted subjective bias in the labelling process. Unfortunately, this leads to the key problem of identifying a ground truth for training the affect recognition system. This problem becomes more relevant with sparsely-crossed annotation, where each rater only labels a portion of the full dataset to ensure a manageable workload per rater. In this paper, we introduce a Multi-Rater Consensus Learning (MRCL) method which learns a representative affect recognition model that accounts for each rater's agreement with the other raters. MRCL combines a multitask learning (MTL) regularizer and a consensus loss. Unlike standard MTL, this approach allows the model to learn to predict each rater's label while explicitly accounting for the consensus among raters. We evaluated our approach on two different datasets based on spontaneous affective body movement expressions, for pain behaviour detection and laughter type recognition respectively. The two naturalistic datasets were chosen for the different forms of labelling (differing in affect, observation stimuli, and raters) that they together offer for evaluating our approach. Empirical results demonstrate that MRCL is effective for modelling affect from datasets with sparsely-crossed multi-rater annotation.
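    A rough sketch of the kind of objective the abstract outlines is given below: each rater has its own prediction head, per-rater losses are computed only on the items that rater actually labelled (the sparse crossing), and a consensus penalty pulls the rater-specific predictions towards agreement. The squared-error losses, the variance-based consensus term, and the weighting are illustrative assumptions, not the paper's exact MRCL formulation.

```python
# Hedged sketch of a multi-rater objective with a consensus term; loss choices
# and weighting are placeholders, not the published MRCL loss.
import numpy as np

def mrcl_objective(rater_preds, rater_labels, consensus_weight=0.5):
    """rater_preds: dict rater -> (n,) predicted scores.
       rater_labels: dict rater -> (n,) labels, NaN where the rater did not annotate."""
    per_rater = 0.0
    for r, preds in rater_preds.items():
        labels = rater_labels[r]
        mask = ~np.isnan(labels)                  # only items this rater labelled
        per_rater += np.mean((preds[mask] - labels[mask]) ** 2)
    stacked = np.stack(list(rater_preds.values()))
    consensus = np.mean(np.var(stacked, axis=0))  # disagreement between rater heads
    return per_rater + consensus_weight * consensus

preds = {"r1": np.array([0.8, 0.2, 0.6]), "r2": np.array([0.7, 0.3, 0.5])}
labels = {"r1": np.array([1.0, 0.0, np.nan]), "r2": np.array([np.nan, 0.0, 1.0])}
print(mrcl_objective(preds, labels))
```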

    Author Correction: Embodiment in a Child-Like Talking Virtual Body Influences Object Size Perception, Self-Identification, and Subsequent Real Speaking

    Correction to: Scientific Reports https://doi.org/10.1038/s41598-017-09497-3, published online 29 August 2017.
    • …